Structured sparsity through convex optimization

Authors

  • Francis R. Bach
  • Rodolphe Jenatton
  • Julien Mairal
  • Guillaume Obozinski
Abstract

Sparse estimation methods are aimed at using or obtaining parsimonious representations of data or models. While naturally cast as a combinatorial optimization problem, variable or feature selection admits a convex relaxation through the regularization by the l1-norm. In this paper, we consider situations where we are not only interested in sparsity, but where some structural prior knowledge is available as well. We show that the l1-norm can then be extended to structured norms built on either disjoint or overlapping groups of variables, leading to a flexible framework that can deal with various structures. We present applications to unsupervised learning, for structured sparse principal component analysis and hierarchical dictionary learning, and to supervised learning in the context of non-linear variable selection.
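
As an illustration of the structured norms described in the abstract, the minimal sketch below (not taken from the paper; the weight vector and group structure are hypothetical) evaluates the plain l1 penalty alongside group penalties built on disjoint and on overlapping groups of variables.

```python
import numpy as np

# Hypothetical 6-dimensional weight vector and group structure,
# chosen only to illustrate the penalties.
w = np.array([0.0, 1.5, -2.0, 0.0, 0.3, 0.0])
disjoint_groups = [[0, 1, 2], [3, 4, 5]]             # a partition of the variables
overlapping_groups = [[0, 1, 2], [2, 3, 4], [4, 5]]  # groups share variables

def l1_norm(w):
    # Plain l1 penalty: the convex relaxation of variable selection.
    return np.sum(np.abs(w))

def group_norm(w, groups):
    # Sum of Euclidean norms over groups (an l1/l2 penalty); with
    # overlapping groups this encodes structured sparsity patterns.
    return sum(np.linalg.norm(w[g]) for g in groups)

print(l1_norm(w))                        # 3.8
print(group_norm(w, disjoint_groups))    # disjoint-group penalty
print(group_norm(w, overlapping_groups)) # overlapping-group penalty
```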

Related articles

Solving Structured Sparsity Regularization with Proximal Methods

Proximal methods have recently been shown to provide effective optimization procedures to solve the variational problems defining the l1 regularization algorithms. The goal of the paper is twofold. First we discuss how proximal methods can be applied to solve a large class of machine learning algorithms which can be seen as extensions of l1 regularization, namely structured sparsity regularizat...
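
For context, the sketch below (an illustration under standard assumptions, not code from the paper above) shows the proximal operators such methods rely on: coordinate-wise soft-thresholding for the l1 norm and block soft-thresholding for a disjoint group penalty, plus a single proximal-gradient (ISTA) step on hypothetical least-squares data.

```python
import numpy as np

def prox_l1(v, t):
    # Proximal operator of t * ||.||_1: coordinate-wise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_group(v, t, groups):
    # Proximal operator of t * sum_g ||v_g||_2 for disjoint groups:
    # each group is either zeroed out or shrunk towards zero.
    out = v.copy()
    for g in groups:
        norm = np.linalg.norm(v[g])
        out[g] = 0.0 if norm <= t else (1.0 - t / norm) * v[g]
    return out

# One proximal-gradient (ISTA) step on 0.5 * ||A w - b||^2 + lam * ||w||_1,
# with hypothetical data and step size 1 / ||A||_2^2.
rng = np.random.default_rng(0)
A, b = rng.normal(size=(20, 6)), rng.normal(size=20)
lam, w = 0.1, np.zeros(6)
step = 1.0 / np.linalg.norm(A, 2) ** 2
w = prox_l1(w - step * A.T @ (A @ w - b), step * lam)
```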

Structured Sparsity Models for Multiparty Speech Recovery from Reverberant Recordings

We tackle the multi-party speech recovery problem through modeling the acoustics of the reverberant chambers. Our approach exploits structured sparsity models to perform room modeling and speech recovery. We propose a scheme for characterizing the room acoustics from the unknown competing speech sources relying on localization of the early images of the speakers by sparse approximation of the spa...

How to exploit prior information in low-complexity models

Compressed Sensing refers to extracting a low-dimensional structured signal of interest from its incomplete random linear observations. A line of recent work has shown that, with extra prior information about the signal, one can recover the signal from far fewer observations. For this purpose, the general approach is to solve a weighted convex minimization problem. In such settings...
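
As a small illustration of the weighted formulation mentioned above (not code from the paper; the signal, weights, and threshold are hypothetical), prior knowledge about the likely support can be encoded by penalizing those coordinates less in a weighted l1 proximal step.

```python
import numpy as np

def prox_weighted_l1(v, t, weights):
    # Proximal operator of t * sum_i weights[i] * |v_i|:
    # soft-thresholding with a per-coordinate threshold.
    return np.sign(v) * np.maximum(np.abs(v) - t * weights, 0.0)

# Hypothetical prior: the first three coordinates are believed to lie in the
# support, so they receive a smaller weight (weaker penalty) than the rest.
v = np.array([0.8, -0.4, 0.2, 0.5, -0.3, 0.1])
weights = np.array([0.2, 0.2, 0.2, 1.0, 1.0, 1.0])
print(prox_weighted_l1(v, 0.3, weights))
```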

Structured Sparsity: Discrete and Convex approaches

Compressive sensing (CS) exploits sparsity to recover sparse or compressible signals from dimensionality-reducing, non-adaptive sensing mechanisms. Sparsity is also used to enhance interpretability in machine learning and statistics applications: while the ambient dimension is vast in modern data analysis problems, the relevant information therein typically resides in a much lower-dimensional s...

Convex Optimization For Non-Convex Problems via Column Generation

We apply column generation to approximating complex structured objects via a set of primitive structured objects under either the cross-entropy or L2 loss. We use L1 regularization to encourage the use of few structured primitive objects. We attack the approximation using convex optimization over an infinite number of variables, each corresponding to a primitive structured object, that are generated ...

Journal:
  • CoRR

Volume: abs/1109.2397

Publication date: 2011